The (Honest) Truth About Dishonesty - Critical summary review - Dan Ariely

Science and Psychology

This microbook is a summary/original review based on the book: The (Honest) Truth About Dishonesty: How We Lie to Everyone – Especially Ourselves

ISBN: 0062183613

Publisher: Harper

Critical summary review

If you consider yourself a good and honest person, then let Dan Ariely and his exceptional book, “The (Honest) Truth About Dishonesty,” burst your bubble: you are not, never have been, and never will be one. As the popular behavioral economist demonstrates through a plethora of social experiments, we all cheat constantly. Fortunately – and contrary to common belief – we do so only to a certain extent, beyond which even money might not change our minds.

So, get ready to discover in what ways – and to what extent – we lie, and why we ourselves are the easiest people to fool.

The ring of Gyges… and Big Brother’s eyes

In the first chapter of Plato’s “Republic,” the philosopher’s brother Glaucon asks Socrates whether a man would remain virtuous if his crimes were undetectable. To make his point clearer, he tells the story of the ring of Gyges, which granted invisibility to whoever wore it, properly adjusted on the finger. Of course, once he found the ring and discovered its power, Gyges – then a mere shepherd – used it to infiltrate the king’s court, seduce the queen, kill the ruler, and crown himself the next one.

Now, Socrates doesn’t really answer Glaucon’s question in Plato’s work. He explains, at some length, that one’s virtue is not an independent feature of one’s character, but something directly related to the virtue of the state. In Socrates’ opinion, a man or a woman can only become virtuous if the state is – that is to say, only if they feel that someone is watching them. Grant them Gyges’ rings and, most probably, most of them will act dishonorably rather than righteously.

As frightening as Socrates’ assessment might sound in an increasingly democratic world, modern behavioral science suggests it might not be far off the mark. “To me, Plato’s myth offers a nice illustration of the notion that group settings can inhibit our propensity to cheat,” writes Ariely.

The author continues: “When we work within a team, other team members can act informally as monitors, and, knowing that we are being watched, we may be less inclined to act dishonorably.” This is actually an understatement: even when there are no people around, merely simulating someone’s presence yields more ethical behavior. In one famous experiment, for example, replacing the image above an unattended cash box from one of flowers with one of a pair of eyes caused donations to surge. And you’re about to find out why.

The SMORC – and everything that’s wrong with it

If you know anything about Ariely or behavioral economics, then you already know that he believes traditional economic theories are based on a serious study of human nature about as much as the flat-earth theory is based on a study of Earth’s circumference.

Case in point: the SMORC, or the Simple Model of Rational Crime. Its origin is pretty mundane. Gary Becker – a Nobel Prize-winning economist – was once running late for a meeting and couldn’t find a legal parking spot, so he opted to park illegally: he decided that the cost of missing the meeting was greater than the cost of being caught, fined, and possibly towed. An important revelation followed: this cost-benefit analysis was just a streamlined version of how people instinctively think – and must be precisely what goes on in the minds of most criminals.
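To see the model’s logic at a glance, here is a minimal sketch of the SMORC decision rule in Python – our own illustration, not code from the book, with hypothetical names and numbers throughout.

```python
# A minimal, illustrative sketch of the SMORC (Simple Model of Rational Crime).
# All variable names and numbers are hypothetical, not taken from the book.

def smorc_would_cheat(benefit: float, p_caught: float, penalty: float) -> bool:
    """A purely 'rational' agent cheats whenever the expected gain
    exceeds the expected cost of being caught."""
    expected_cost = p_caught * penalty
    return benefit > expected_cost

# Becker's parking dilemma with made-up figures: missing the meeting costs
# $200; if caught (say, 1 chance in 10), the fine plus towing costs $150.
print(smorc_would_cheat(benefit=200, p_caught=0.1, penalty=150))  # True -> park illegally
```

The experiments described below show why real people don’t behave like this agent: tripling the payoff or driving the probability of being caught to zero barely changes how much they cheat.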

The problem is, we now know for a fact that human thinking doesn’t work that way. Take this simple experiment: two groups are tasked with solving 20 math assignments and told they will receive $5 for each correct solution. There’s a catch, of course: the participants are told that only the worksheets of the first group will be reviewed. As expected, the unreviewed group reports far better results: knowing that their word will be taken at face value, these examinees claimed, on average, to have solved six assignments, while the reviewed group’s average was four.

Later, the stakes were raised: the participants were told they would be given $15 for each correct answer. According to the SMORC, this should have inspired them to make even more outrageous claims. But that didn’t happen: the averages remained the same. The most interesting part? Nothing changed even when the unreviewed group was allowed to shred their tests, thus completely eliminating the chance of being caught.

Two types of motivation – and the fudge factor theory

The reason the SMORC is wrong is pretty simple: the motivation to lie is merely one aspect of our character. “Honesty and dishonesty,” clarifies Ariely, “are based on a mixture of two very different types of motivation. On the one hand, we want to benefit from cheating (this is the rational economic motivation), while on the other, we want to be able to view ourselves as wonderful human beings (this is the psychological motivation).”

In other words, we want to both make an omelet and keep the eggs unbroken. Rationally, this is impossible. But, then again, we are not rational beings, and we are quite capable of making compromises with ourselves. “Basically,” elucidates Ariely, “as long as we cheat just a little bit, we can have the cake and eat (some of) it too. We can reap some of the benefits of dishonesty while maintaining a positive image of ourselves.” This is what Ariely calls the “fudge factor theory.”

In essence, it’s about balancing our motivations. We cannot help it: our psychological motivation – the innate wish to see ourselves as wonderful human beings – is just as powerful as our commonsensical economic drive to earn more with less. And this is precisely why considerations such as the amount of money we stand to gain and the probability of being caught are only part of the whole story: it’s not that we don’t make an instinctive cost-benefit analysis before deciding – it’s just that it influences our final choice far less than one might think.

That’s because numerous other forces are at play, such as “moral reminders, distance from money, conflicts of interest, depletion, counterfeits, reminders of our fabricated achievements, creativity, witnessing others’ dishonest acts, caring about others on our team, and so on.” Let’s see how some of them work in practice.

May the Ten Commandments be engraved in you

To understand the other forces in question, let’s add another twist to the experiment with the two groups solving the math test: the second group of unreviewed examinees was divided into two smaller groups. The first was asked by the researchers to recall the Ten Commandments before taking the test. The other was instructed to recall ten random books they had read and studied in high school.

As expected, the members of this second group cheated as much as before. The group reminded of the Ten Commandments, however, didn’t. At all. Apparently, it didn’t even matter whether the people were religious or not. What mattered was the feeling that there was more to cheating than one’s own benefit.

Even so, here’s yet another example – this one for our atheist readers who can’t quite relate to the one just described. A business consultant – actually, a comedian in disguise – comes to your school and gives you a lesson on how to cheat and get away with it. Do you: a) consider the advice helpful; b) have an uneasy feeling that a business consultant is telling you this; or c) both of the above?

For most people, it’s “c” – a clear demonstration that our decisions are governed by more than one motivation. “These bits of advice are good for you,” says the rational voice in your head. “Wait a second,” replies the emotional voice, “…but isn’t it strange that you’re being told this by a business consultant? Is this how companies are run in this country?”

One step removed from the money

Another factor that contributes to our moral decisions is our distance from money – as the following experiment shows.

In an MIT dorm, half of the communal refrigerators were stocked with six-packs of Coca-Cola; the other half, with paper plates holding six $1 bills. All of the students knew that neither the Cokes nor the dollar bills belonged to them. They also knew – or at least had a vague idea – that the two were worth roughly the same: you can buy a can of Coke for a dollar. What do you think happened after three days?

As you might have already guessed, all of the Cokes were gone, but not one of the dollar bills was touched. Nothing stopped the students from taking the bills, walking over to a nearby vending machine, and getting a Coke and some change. Yet no one did. The reason was the psychological motivation: the fudge factor grows with the emotional distance between the dishonest act and its consequences.

In plainer words, it is easier to convince yourself that you are a good person if doing so takes just one step of rationalization – like taking a Coke from a refrigerator. If it takes more than one step – taking some money and then buying a Coke with it – it’s much harder to digest psychologically. It’s the difference between premeditated murder and a crime of passion: the law punishes the two differently because, in the first case, you had time to change your mind.

The caveat? “From all the research I have done over the years,” frets Ariely, “the idea that worries me the most is that the more cashless our society becomes, the more our moral compass slips.”

Limiting dishonesty – and resetting the moral compass

Knowing all of the above, limiting dishonesty shouldn’t be that difficult: merely nudging people in the right direction should do the trick. Take this real-life case, for example.

A woman in South America noticed that her maid had been stealing a little meat from the freezer every few days. She was rich enough not to care too much, but since she didn’t like being cheated, she decided to do something about it. Nothing spectacular: she neither hired a private detective nor fired the maid. She simply put a lock on the freezer and told the maid that she suspected someone was stealing from it. Then she gave her a key to the lock and a small raise, adding that the two of them were now the only ones with keys and had to be watchful.

As expected, this worked! By putting a lock on the freezer, the woman both put a check on the maid’s temptations and created psychological distance: stealing the meat would now take two steps (unlocking the freezer and then taking the meat), and that’s not as easy to rationalize. Finally, by singling out the maid as the most trusted person in the household, she placed a moral burden on her, similar to the one the Ten Commandments placed on the math-testers. And that made all the difference.

So, we should focus on creating similar social policies. “The more we develop and adopt such mechanisms,” concludes Ariely, “the more we will be able to curb dishonesty. It is not always going to be simple, but it is possible.”

Final Notes

“The (Honest) Truth About Dishonesty” is everything you’d expect from a book written by Dan Ariely (or any other popular behavioral economist, for that matter): thought-provoking, counterintuitive, and filled to the brim with social experiments.

“Required reading for politicians and Wall Street executives,” says a Booklist review. Required reading for everyone, we add.

12min Tip

Act as if someone – preferably your mother or God if you are religious – is watching you at all times. Science says you can’t really trust yourself otherwise.


Who wrote the book?

Dan Ariely is an Israeli-American cognitive psychologist and bestselling author, and the James B. Duke Professor of Psychology and Behavioral Economics at Duke University. Aiming to translate his scientific findings into lucrative business opportunities, he has also founded a research institution and several successful startups.
